Linear Separability and Concept Learning: Eyetracking Individual Differences

Author

  • Aaron B. Hoffman
Abstract

Asking people whether a tomato is a fruit, or whether chess is a sport, reveals differences between individuals’ concepts. How can the same perceptual information result in different representations? Perhaps people’s initial attention to particular category information determines which representations they will form. When first learning about chess, for example, attention to its competitive aspect may cause one to classify it as a sport, whereas attention to its non-athletic nature may cause one to classify it as a game. Early research focused not on individual differences but rather on questions such as whether people can learn nonlinearly separable (NLS) categories (as predicted by exemplar memorization) or whether they favor linearly separable (LS) categories (as predicted by prototype abstraction). Indeed, it is generally believed that this question was laid to rest by Medin and Schwanenflugel (1981), who found that NLS categories can be learned as easily as LS categories. However, Blair and Homa (2001) in fact found individual differences, with some subjects favoring LS categories, others NLS categories, and others neither. The present study investigates why individuals might prefer LS over NLS categories, or vice versa. We do this in a novel and direct way, by using categories with independently predictive dimensions that provide an LS solution, and additional, configurally predictive dimensions that provide an NLS solution.
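The LS/NLS contrast can be made concrete with a toy sketch (a hypothetical illustration, not the study's actual stimuli): a category pair is linearly separable when a single weighted sum of the features can split it, which is exactly the case in which a perceptron converges. An AND-like assignment over two binary features is LS; an XOR-like assignment is the classic NLS case, learnable only by configural (exemplar-like) strategies.

```python
# Illustrative sketch, assuming two binary features per item (not the
# study's stimuli): a perceptron converges only if the category pair
# is linearly separable.

def perceptron_separable(points, labels, epochs=100):
    """Return True if a perceptron finds a separating line (LS), else False."""
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        errors = 0
        for (x0, x1), y in zip(points, labels):
            pred = 1 if w0 * x0 + w1 * x1 + b > 0 else -1
            if pred != y:
                # Standard perceptron update on a misclassified item.
                w0 += y * x0
                w1 += y * x1
                b += y
                errors += 1
        if errors == 0:
            return True  # converged: a separating line exists
    return False

pts = [(0, 0), (0, 1), (1, 0), (1, 1)]
ls_labels = [-1, -1, -1, 1]   # AND-like assignment: linearly separable
nls_labels = [-1, 1, 1, -1]   # XOR-like assignment: not linearly separable

print(perceptron_separable(pts, ls_labels))   # True
print(perceptron_separable(pts, nls_labels))  # False
```

In the XOR-like case no weighting of the individual features works; only the configuration of both features predicts the category, which is why such structures demand exemplar- or configuration-based learning.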


Similar Articles

Separability is a Learner’s Best Friend

Geometric separability is a generalisation of linear separability, familiar to many from Minsky and Papert’s analysis of the Perceptron learning method. The concept forms a novel dimension along which to conceptualise learning methods. The present paper shows how geometric separability can be defined and demonstrates that it accurately predicts the performance of at least one empirical learning...


Classification of Two-Class Data with Hyperrectangles Parallel to the Coordinate Axes

Supervised learning is one of the central machine learning tasks. In supervised learning we infer a function from labeled training data, and the goal of a supervised learning algorithm is to learn a good hypothesis that minimizes the sum of the errors. A wide range of supervised algorithms is available, such as decision trees, SVMs, and KNN methods. In this paper we focus on decision tree algorithms. When we ...


Induction of Linear Separability through the Ranked Layers of Binary Classifiers

The concept of linear separability is used in the theory of neural networks and pattern recognition methods. The term relates to the examination of whether learning sets (classes) can be separated by hyperplanes in a given feature space. A family of K disjoint learning sets can be transformed into K linearly separable sets by a ranked layer of binary classifiers. Problems of the ranked layers' designi...


A Novel and Efficient Method for Testing Non Linear Separability

The notion of linear separability is widely used in machine learning research. Learning algorithms that rely on this concept include neural networks (the Single Layer Perceptron and the Recursive Deterministic Perceptron) and kernel machines (Support Vector Machines). Several algorithms for testing linear separability exist. Some of these methods are computationally intense. Also, several of them...
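One exact (if naive) alternative to the methods surveyed above can be sketched for the planar case: if two finite point sets admit a separating line at all, a separating line can be chosen through two points of their union, so checking every such candidate line suffices. The sketch below is a hypothetical O(n³) illustration of that geometric fact, not one of the paper's algorithms, and it does not handle the degenerate case where all points are collinear.

```python
# Hedged sketch: exact brute-force test of (weak) linear separability
# for two finite point sets in the plane. Not one of the paper's
# methods; degenerate all-collinear inputs are not handled.

from itertools import combinations

def side(p, q, r):
    """Signed area: >0 if r lies left of the line p->q, <0 if right, 0 if on it."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def linearly_separable(A, B):
    """True if some line puts all of A in one closed half-plane and all of B in the other."""
    for p, q in combinations(A + B, 2):
        sa = [side(p, q, r) for r in A]
        sb = [side(p, q, r) for r in B]
        if all(v >= 0 for v in sa) and all(v <= 0 for v in sb):
            return True
        if all(v <= 0 for v in sa) and all(v >= 0 for v in sb):
            return True
    return False

A = [(0, 0), (0, 1), (1, 0)]          # one category
B = [(1, 1)]                          # separable from A
print(linearly_separable(A, B))       # True
print(linearly_separable([(0, 0), (1, 1)],
                         [(0, 1), (1, 0)]))  # False (XOR configuration)
```

The candidate-line enumeration makes the test exact for finite inputs, at the cost of cubic time; the LP- and perceptron-based methods the abstract alludes to trade that exactness guarantee for better scaling in higher dimensions.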


Expanding the search for a linear separability constraint on category learning.

Formal models of categorization make different predictions about the theoretical importance of linear separability. Prior research, most of which has failed to find support for a linear separability constraint on category learning, has been conducted using tasks that involve learning two categories with a small number of members. The present experiment used four categories with three or nine pa...




Publication date: 2006